попарно независимые случайные величины

Mathematics: pairwise independent random variables

Универсальный русско-английский словарь (Universal Russian-English Dictionary) > попарно независимые случайные величины
See also in other dictionaries:
Pairwise independence — In probability theory, a pairwise independent collection of random variables is a set of random variables any two of which are independent. Any collection of mutually independent random variables is pairwise independent, but some pairwise independent collections are not mutually independent. (Wikipedia)
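The gap between pairwise and mutual independence can be checked exhaustively on the classic XOR construction. The following sketch is illustrative and not taken from the dictionary entry itself: X and Y are fair independent coin flips and Z = X XOR Y; any two of the three variables are independent, yet Z is fully determined by X and Y.

```python
from itertools import product

# The four equally likely outcomes of (X, Y, Z) with Z = X XOR Y.
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

def prob(event):
    """Probability of an event over the 4 equally likely outcomes."""
    return sum(1 for o in outcomes if event(o)) / 4

# Pairwise independence: P(A=a, B=b) = P(A=a) * P(B=b) for every pair
# of coordinates and every pair of values.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        assert prob(lambda o: o[i] == a and o[j] == b) == \
               prob(lambda o: o[i] == a) * prob(lambda o: o[j] == b)

# But not mutual independence: (X, Y, Z) = (0, 0, 1) is impossible,
# while the product of the marginals is 1/8.
assert prob(lambda o: o == (0, 0, 1)) == 0.0
assert prob(lambda o: o[0] == 0) * prob(lambda o: o[1] == 0) * prob(lambda o: o[2] == 1) == 0.125
```

Because the sample space has only four points, the check is an exact enumeration rather than a simulation.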
Markov random field — A Markov random field, Markov network or undirected graphical model is a set of variables having a Markov property described by an undirected graph. A Markov random field is similar to a Bayesian network in its representation of dependencies… (Wikipedia)
Normally distributed and uncorrelated does not imply independent — In probability theory, two random variables being uncorrelated does not imply their independence. In some contexts, uncorrelatedness implies at least pairwise independence (as when the random variables involved have Bernoulli distributions)… (Wikipedia)
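A standard counterexample (assumed here for illustration, not quoted from the entry): take X standard normal and Y = S·X with S an independent random sign. Then Y is also standard normal and Cov(X, Y) = E[S]·E[X²] = 0, yet X and Y are dependent, since |Y| = |X| always.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
s = rng.choice([-1.0, 1.0], size=x.size)  # independent random sign
y = s * x

corr = np.corrcoef(x, y)[0, 1]
print(f"sample correlation: {corr:.4f}")   # close to 0 (sampling noise ~ 1/sqrt(n))
assert abs(corr) < 0.05                    # uncorrelated, up to sampling noise
assert np.allclose(np.abs(x), np.abs(y))   # ...but deterministically dependent
```

The two assertions make the contrast concrete: the sample correlation is statistically indistinguishable from zero, while the dependence is exact, not approximate.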
Central limit theorem — This figure demonstrates the central limit theorem. The sample means are generated using a random number generator, which draws numbers between 1 and 100 from a uniform probability distribution. It illustrates that increasing sample sizes result… (Wikipedia)
Independence (probability theory) — In probability theory, to say that two events are independent intuitively means that the occurrence of one event makes it neither more nor less probable that the other occurs. For example: the event of getting a 6 the first time a die is rolled… (Wikipedia)
Karhunen-Loève theorem — In the theory of stochastic processes, the Karhunen-Loève theorem (named after Kari Karhunen and Michel Loève) is a representation of a stochastic process as an infinite linear combination of orthogonal functions, analogous to a Fourier series… (Wikipedia)
Statistical independence — In probability theory, to say that two events are independent intuitively means that the occurrence of one event makes it neither more nor less probable that the other occurs. For example: the event of getting a 6 the first time a die is rolled… (Wikipedia)
Mutual information — [Figure: individual (H(X), H(Y)), joint (H(X,Y)), and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X; Y).] In probability theory and information theory, the mutual information (sometimes known by the archaic term… (Wikipedia)
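The quantities in the figure caption above satisfy the identity I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal sketch of that computation for a small discrete joint distribution (the numbers are illustrative, not from the entry):

```python
import math

# An illustrative joint distribution P(X=x, Y=y) over two binary variables.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {value: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginals of X and Y obtained by summing out the other variable.
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

mi = entropy(px) + entropy(py) - entropy(joint)
print(f"I(X;Y) = {mi:.4f} bits")  # ≈ 0.2781 bits for this joint distribution
```

Here both marginals are uniform (H(X) = H(Y) = 1 bit), so all of the mutual information comes from the joint entropy falling short of 2 bits.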
Wiener process — In mathematics, the Wiener process is a continuous-time stochastic process named in honor of Norbert Wiener. It is often called Brownian motion, after Robert Brown. It is one of the best known Lévy processes (càdlàg stochastic processes with… (Wikipedia)
Lévy process — In probability theory, a Lévy process, named after the French mathematician Paul Lévy, is any continuous-time stochastic process that starts at 0, admits a càdlàg modification and has stationary independent increments; this phrase will be explained… (Wikipedia)
Glossary of probability and statistics — The following is a glossary of terms. It is not intended to be all-inclusive. Concerned fields: probability theory, algebra of random variables (linear algebra), statistics, measure theory, estimation theory. Glossary: atomic event: another name… (Wikipedia)